Nearly Tight Oblivious Subspace Embeddings by Trace Inequalities

Author

  • Michael B. Cohen
Abstract

We present a new analysis of sparse oblivious subspace embeddings, based on the "matrix Chernoff" technique. These are probability distributions over (relatively) sparse matrices such that for any d-dimensional subspace of R^n, the norms of all vectors in the subspace are simultaneously approximately preserved by the embedding with high probability, typically with parameters depending on d but not on n. The families of embedding matrices considered here are essentially the same as those in [NN13], but with better parameters (sparsity and embedding dimension). Because of this, the analysis serves as a "drop-in replacement" for Nelson-Nguyen's, improving bounds in its many applications to problems such as least squares regression and low-rank approximation. The new method is based on elementary tail bounds combined with matrix trace inequalities (Golden-Thompson or Lieb's theorem), and does not require combinatorics, unlike the Nelson-Nguyen approach. There are also variants of this method that are even simpler, at the cost of worse parameters. Furthermore, the bounds obtained are much tighter than previous ones, matching known lower bounds up to a single log(d) factor in embedding dimension (previous results had more log factors and also suboptimal tradeoffs with sparsity).

Thesis Supervisor: Jonathan Kelner (Associate Professor of Applied Mathematics)
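As a rough illustration of the subspace-embedding property itself (not of the paper's analysis), the following Python sketch draws an OSNAP-style sparse matrix with s nonzeros of value ±1/√s per column, as in the [NN13] family, and checks that the singular values of ΠU cluster around 1 for a random d-dimensional subspace with orthonormal basis U. The parameters m and s below are illustrative choices, not the bounds proved in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_ose(m, n, s, rng):
    """OSNAP-style sparse embedding: each column has s nonzeros,
    each equal to +/- 1/sqrt(s), placed in s distinct random rows."""
    Pi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)
        signs = rng.choice([-1.0, 1.0], size=s)
        Pi[rows, j] = signs / np.sqrt(s)
    return Pi

n, d = 10_000, 20
m, s = 2_000, 8          # illustrative parameters, not the paper's bounds

# Orthonormal basis U of a random d-dimensional subspace of R^n.
U, _ = np.linalg.qr(rng.standard_normal((n, d)))

Pi = sparse_ose(m, n, s, rng)
# Subspace embedding <=> all singular values of Pi @ U are close to 1,
# i.e. ||Pi U x|| ~ ||U x|| = ||x|| simultaneously for all x in the subspace.
sv = np.linalg.svd(Pi @ U, compute_uv=False)
print(f"singular values of Pi@U lie in [{sv.min():.3f}, {sv.max():.3f}]")
```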


Similar resources

Tight Bounds for ℓp Oblivious Subspace Embeddings

An ℓp oblivious subspace embedding is a distribution over r × n matrices Π such that for any fixed n × d matrix A,

Pr_Π[ for all x, ‖Ax‖_p ≤ ‖ΠAx‖_p ≤ κ‖Ax‖_p ] ≥ 9/10,

where r is the dimension of the embedding, κ is the distortion of the embedding, and for an n-dimensional vector y, ‖y‖_p = (∑_{i=1}^n |y_i|^p)^{1/p} is the ℓp-norm. Another important property is the sparsity of Π, that is, the maximum numbe...
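To make the definition concrete, here is a hedged numerical sketch for p = 1: it samples a dense Cauchy matrix (a classical ℓ1 embedding, used purely for illustration; the 1/r scaling is ad hoc, not a tuned constant) and measures the distortion κ over random test directions. Testing random x is only a necessary condition, since the definition quantifies over all x.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, r = 5_000, 10, 500   # illustrative sizes

A = rng.standard_normal((n, d))
# Dense Cauchy sketch: a classical l1 embedding, used here only to
# illustrate the distortion kappa from the definition above.
Pi = rng.standard_cauchy((r, n)) / r

# Estimate distortion over random directions x (a necessary condition
# only; the definition requires the inequality for *all* x).
ratios = []
for _ in range(200):
    x = rng.standard_normal(d)
    ratios.append(np.linalg.norm(Pi @ A @ x, ord=1)
                  / np.linalg.norm(A @ x, ord=1))
ratios = np.array(ratios)
print(f"empirical distortion range: [{ratios.min():.2f}, {ratios.max():.2f}]")
```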


Subspace Embeddings and ℓp-Regression Using Exponential Random Variables

Oblivious low-distortion subspace embeddings are a crucial building block for numerical linear algebra problems. We show for any real p, 1 ≤ p < ∞, given a matrix M ∈ R^{n×d} with n ≫ d, with constant probability we can choose a matrix Π with max(1, n^{1−2/p})·poly(d) rows and n columns so that simultaneously for all x ∈ R^d, ‖Mx‖_p ≤ ‖ΠMx‖_∞ ≤ poly(d)‖Mx‖_p. Importantly, ΠM can be computed in the optimal O(n...
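The reason an ℓ∞ norm of the sketch can stand in for an ℓp norm, as in the guarantee above, is the max-stability of exponential random variables: for i.i.d. standard exponentials E_i, the maximum max_i |y_i|/E_i^{1/p} has the same distribution as ‖y‖_p/E^{1/p} for a single standard exponential E. Below is a small simulation of this distributional identity (a check of the building block only, not of the full embedding).

```python
import numpy as np

rng = np.random.default_rng(2)
p = 1.5
y = rng.standard_normal(1_000)          # an arbitrary fixed vector
yp = np.linalg.norm(y, ord=p)

# Max-stability of exponentials: max_i |y_i| / E_i^{1/p} is distributed
# as ||y||_p / E^{1/p} for a single standard exponential E.
trials = 20_000
E = rng.exponential(size=(trials, y.size))
lhs = np.max(np.abs(y) / E ** (1.0 / p), axis=1)
rhs = yp / rng.exponential(size=trials) ** (1.0 / p)
print("median lhs:", np.median(lhs), " median rhs:", np.median(rhs))
```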



Subspace Embeddings for the Polynomial Kernel

Sketching is a powerful dimensionality reduction tool for accelerating statistical learning algorithms. However, its applicability has been limited to a certain extent since the crucial ingredient, the so-called oblivious subspace embedding, can only be applied to data spaces with an explicit representation as the column span or row span of a matrix, while in many settings learning is done in a...


Lower Bounds for Oblivious Subspace Embeddings

An oblivious subspace embedding (OSE) for some ε, δ ∈ (0, 1/3) and d ≤ m ≤ n is a distribution D over R^{m×n} such that for any linear subspace W ⊂ R^n of dimension d,

Pr_{Π∼D}( ∀x ∈ W, (1 − ε)‖x‖_2 ≤ ‖Πx‖_2 ≤ (1 + ε)‖x‖_2 ) ≥ 1 − δ.

We prove that any OSE with δ < 1/3 must have m = Ω((d + log(1/δ))/ε²), which is optimal. Furthermore, if every Π in the support of D is sparse, having at most s non-zero entries...
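For contrast with this lower bound, a dense Gaussian sketch achieves m = O((d + log(1/δ))/ε²), so the bound is tight up to constants. A minimal check, with an assumed illustrative constant of 4 in the dimension formula:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 4_000, 20
eps, delta = 0.5, 0.1
# Embedding dimension at the Omega((d + log(1/delta))/eps^2) lower
# bound, up to an illustrative constant factor of 4.
m = int(4 * (d + np.log(1 / delta)) / eps**2)

U, _ = np.linalg.qr(rng.standard_normal((n, d)))   # basis of subspace W
Pi = rng.standard_normal((m, n)) / np.sqrt(m)      # dense Gaussian OSE

# (1-eps)||x|| <= ||Pi x|| <= (1+eps)||x|| for all x in W  <=>
# all singular values of Pi @ U lie in [1-eps, 1+eps].
sv = np.linalg.svd(Pi @ U, compute_uv=False)
ok = (sv.min() >= 1 - eps) and (sv.max() <= 1 + eps)
print(f"m={m}, singular values in [{sv.min():.3f}, {sv.max():.3f}], OSE holds: {ok}")
```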




Publication date: 2016